
    On Foundation of the Generalized Nambu Mechanics

    We outline the basic principles of a canonical formalism for Nambu mechanics, a generalization of Hamiltonian mechanics proposed by Yoichiro Nambu in 1973. It is based on the notion of the Nambu bracket, which generalizes the Poisson bracket to a multiple operation of higher order n ≄ 3 on classical observables, and is described by Nambu-Hamilton equations of motion given by n − 1 Hamiltonians. We introduce the fundamental identity for the Nambu bracket, which replaces the Jacobi identity as a consistency condition for the dynamics. We show that a Nambu structure of given order defines a family of subordinated structures of lower order, including the Poisson structure, satisfying certain matching conditions. We introduce analogs of the action form and the principle of least action for Nambu mechanics and show how the dynamics of loops ((n − 2)-dimensional objects) naturally appears in this formalism. We discuss several approaches to the quantization problem and present an explicit representation of the Nambu-Heisenberg commutation relation for the n = 3 case. We emphasize the role that higher-order algebraic operations, and the mathematical structures related to them, play in passing from Hamilton's to Nambu's dynamical picture. Comment: 27 pages
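    For the lowest nontrivial case n = 3, the canonical Nambu bracket on R^3 can be written as a Jacobian determinant; this standard example (not spelled out in the abstract itself) makes the generalization concrete:

```latex
\{f_1, f_2, f_3\}
  = \frac{\partial(f_1, f_2, f_3)}{\partial(x_1, x_2, x_3)}
  = \epsilon^{ijk}\, \partial_i f_1 \, \partial_j f_2 \, \partial_k f_3 ,
\qquad
\frac{df}{dt} = \{f, H_1, H_2\},
```

    so that the time evolution of an observable f is generated by the n − 1 = 2 Hamiltonians H_1 and H_2, and fixing one slot of the bracket recovers a subordinated Poisson-type structure.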

    On Mean Pose and Variability of 3D Deformable Models

    We present a novel methodology for the analysis of complex object shapes in motion observed by multiple video cameras. In particular, we propose to learn local surface rigidity probabilities (i.e., deformations) and to estimate a mean pose over a temporal sequence. Local deformations can be used for rigidity-based dynamic surface segmentation, while a mean pose can be used as a sequence keyframe or a cluster prototype; it therefore has numerous applications, such as motion synthesis or sequential alignment for compression or morphing. We take advantage of recent advances in surface tracking techniques to formulate a generative model of 3D temporal sequences using a probabilistic framework, which conditions shape fitting over all frames on a simple set of intrinsic surface rigidity properties. Surface tracking and rigidity variable estimation can then be formulated as an Expectation-Maximization inference problem and solved by alternately minimizing two nested fixed-point iterations. We show that this framework provides a new fundamental building block for various applications of shape analysis and achieves tracking performance comparable to state-of-the-art surface tracking techniques on real datasets, even compared to approaches using strong kinematic priors such as rigid skeletons.

    Polymeric Separation Media: Binding of α,ÎČ-Unsaturated Carbonyl Compounds to Insoluble Resins through Michael Additions or Chelation of Derivatives

    This is the publisher's version, also available electronically from "http://www.degruyter.com"

    Inertia in Innovation-Protection Strategies

    This article aims to introduce a temporal dimension into the analysis of innovation-protection choices. The existing literature focuses mainly on the firm's choice of method: formal methods (chiefly the patent) and informal methods (secrecy, speed to market, and design complexity). Several studies have identified the factors driving the choice of protection strategy, in particular firm size, reliance on cooperation, R&D expenditure, market size, and industry sector. We extend this work by mobilizing existing notions such as path dependence and escalation of commitment, which emphasize that organizations are frequently slow to react to changes in their environment. In particular, they appear to find it difficult to scale back investments whose performance is disappointing; more generally, commitment to a choice tends to strengthen, sometimes to the point of becoming irreversible. Hence our two research questions: to what extent does the choice of an innovation-protection strategy favor the subsequent choice of the same strategy, and what factors influence a change of protection strategy? To answer them, we analyze data from the CIS 4 and CIS 2006 surveys. Using logistic and probit models, we follow firms' choices over two periods. Whether firms use informal methods, a combination of methods (patent plus informal methods), or no method at all, we find that the choice in question depends very strongly on whether the same choice was made in the previous period.
    The use of the patent alone, by contrast, does not appear subject to the same dependence, which makes this protection strategy look more transitory or unstable. Beyond the inertia observed in the choices, the results clearly show a marked difference between patents and informal methods. Whereas the choice to use the patent is not very sensitive to changes in other variables, the use of informal methods is far more changeable, consistent with the idea of a certain flexibility in its use.

    Extreme value distributions and Renormalization Group

    In the classical theorems of extreme value theory, the limits of suitably rescaled maxima of sequences of independent, identically distributed random variables are studied. So far, only affine rescalings have been considered. We show, however, that more general rescalings are natural and lead to new limit distributions, apart from the Gumbel, Weibull, and Fréchet families. The problem is approached using the language of Renormalization Group transformations in the space of probability densities. The limit distributions are fixed points of the transformation, and the study of the differential around them allows a local analysis of the domains of attraction and the computation of finite-size corrections. Comment: 16 pages, 5 figures. Final version
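    A minimal sketch of the renormalization picture alluded to (the notation here is assumed for illustration, not taken from the paper): doubling the sample size acts on a distribution function F as

```latex
(\mathcal{R}_g F)(x) = F\!\big(g(x)\big)^{2},
```

    where g is the rescaling map. Restricting g to affine maps g(x) = ax + b recovers the classical max-stable fixed points (the Gumbel, Weibull, and Fréchet families), while more general rescalings g admit further fixed-point distributions.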

    Using Extreme Value Theory for Determining the Probability of Carrington-Like Solar Flares

    Space weather events can negatively affect satellites, the electricity grid, satellite navigation systems and human health. As a consequence, extreme space weather has been added to the UK and other national risk registers. By their very nature, extreme space weather events occur rarely and, therefore, statistical methods are required to determine the probability of their occurrence. Space weather events can be characterised by a number of natural phenomena such as X-ray (solar) flares, solar energetic particle (SEP) fluxes, coronal mass ejections and various geophysical indices (Dst, Kp, F10.7). In this paper extreme value theory (EVT) is used to investigate the probability of extreme solar flares. Previous work has assumed that the distribution of solar flares follows a power law. However, such an approach can lead to a poor estimation of the return times of such events due to uncertainties in the tails of the probability distribution function. Using EVT and GOES X-ray flux data it is shown that the expected 150-year return level is approximately an X60 flare, whilst a Carrington-like flare is a one-in-100-year event. It is also shown that the EVT results are consistent with flare data from the Kepler space telescope mission. Comment: 13 pages, 4 figures; updated content following reviewer feedback
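    The kind of return-level estimate described above can be sketched with a generalized Pareto (peaks-over-threshold) fit. The data, threshold choice, and observation rate below are synthetic illustrations, not the paper's actual GOES fluxes or fitted parameters:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic heavy-tailed "flare magnitudes" standing in for X-ray flux maxima.
data = genpareto.rvs(c=0.2, scale=1.0, size=5000, random_state=rng)

threshold = np.quantile(data, 0.95)           # peaks-over-threshold cutoff
excesses = data[data > threshold] - threshold
c, loc, scale = genpareto.fit(excesses, floc=0.0)

# Return level for an event exceeded once per n_obs observations on average:
# solve rate_of_exceedance * P(excess > x - threshold) = 1 / n_obs.
rate = excesses.size / data.size
n_obs = 150 * 365                             # e.g. ~150 years of daily maxima
p = 1.0 / (n_obs * rate)
return_level = threshold + genpareto.ppf(1.0 - p, c, loc=0.0, scale=scale)
print(return_level)
```

    The tail-shape parameter c controls how quickly extreme quantiles grow, which is why EVT can give very different return times from a simple power-law extrapolation.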

    A direct route to cyclic organic nanostructures via ring-expansion metathesis polymerization of a dendronized macromonomer

    Cyclic organic nanostructures were prepared via ring-expansion metathesis polymerization of a dendronized norbornene macromonomer. The strategy provides a direct, efficient route to nanoscale rings in a single operation. AFM imaging confirmed toroidal features having diameters of ca. 35−40 nm.

    Ordered spectral statistics in 1D disordered supersymmetric quantum mechanics and Sinai diffusion with dilute absorbers

    Some results on the ordered statistics of eigenvalues for one-dimensional random Schrödinger Hamiltonians are reviewed. In the case of supersymmetric quantum mechanics with disorder, the existence of low-energy delocalized states induces eigenvalue correlations and makes the ordered-statistics problem nontrivial. The resulting distributions are used to analyze the problem of classical diffusion in a random force field (the Sinai problem) in the presence of weakly concentrated absorbers. It is shown that the slowly decaying averaged return probability of the Sinai problem, $\overline{P(x,t|x,0)} \sim \ln^{-2} t$, is converted into a power-law decay, $\overline{P(x,t|x,0)} \sim t^{-\sqrt{2\rho/g}}$, where $g$ is the strength of the random force field and $\rho$ the density of absorbers. Comment: 10 pages; LaTeX; 4 pdf figures; Proceedings of the meeting "Foundations and Applications of Non-Equilibrium Statistical Mechanics", Nordita, Stockholm, October 2011; v2: appendix added; v3: figure 2 (left) added

    Assessment of multireference approaches to explicitly correlated full configuration interaction quantum Monte Carlo.

    The Full Configuration Interaction Quantum Monte Carlo (FCIQMC) method has proved able to provide near-exact solutions to the electronic Schrödinger equation within a finite orbital basis set, without relying on an expansion about a reference state. However, a drawback of the approach is that, being based on an expansion in Slater determinants, the FCIQMC method suffers from a basis set incompleteness error that decays very slowly with the size of the employed single-particle basis. The FCIQMC results obtained in a small basis set can be improved significantly with explicitly correlated techniques. Here, we present a study that assesses and compares two contrasting "universal" explicitly correlated approaches that fit into the FCIQMC framework: the [2]R12 method of Kong and Valeev [J. Chem. Phys. 135, 214105 (2011)] and the explicitly correlated canonical transcorrelation approach of Yanai and Shiozaki [J. Chem. Phys. 136, 084107 (2012)]. The former is an a posteriori internally contracted perturbative approach, while the latter transforms the Hamiltonian prior to the FCIQMC simulation. These comparisons are made across the 55 molecules of the G1 standard set. We found that both methods consistently reduce the basis set incompleteness error, yielding accurate atomization energies in small basis sets and reducing the error from 28 mEh to 3-4 mEh. While many of the conclusions hold in general for any combination of multireference approaches with these methodologies, we also consider FCIQMC-specific advantages of each approach.
    • 
